Asynchronous Distributed Semi-Stochastic Gradient Optimization

Authors

  • Ruiliang Zhang
  • Shuai Zheng
  • James T. Kwok
Abstract

With the recent proliferation of large-scale learning problems, there has been considerable interest in distributed machine learning algorithms, particularly those based on stochastic gradient descent (SGD) and its variants. However, existing algorithms either suffer from slow convergence due to the inherent variance of stochastic gradients, or achieve a fast linear convergence rate at the expense of poorer solution quality. In this paper, we combine their merits by proposing a fast distributed asynchronous SGD-based algorithm with variance reduction. A constant learning rate can be used, and the algorithm is guaranteed to converge linearly to the optimal solution. Experiments on the Google Cloud Computing Platform demonstrate that the proposed algorithm outperforms state-of-the-art distributed asynchronous algorithms in terms of both wall clock time and solution quality.
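The listing gives only the abstract, so the following is a minimal single-machine sketch of the variance-reduction idea the abstract refers to (an SVRG-style "semi-stochastic" update), not the paper's actual distributed asynchronous algorithm. The least-squares problem, variable names, and constants are illustrative assumptions.

    import numpy as np

    # Toy least-squares problem (illustrative assumption, not from the paper).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = X @ rng.normal(size=5)
    n = len(y)

    def sample_grad(w, i):
        # Gradient of the i-th squared loss 0.5 * (x_i . w - y_i)^2
        return X[i] * (X[i] @ w - y[i])

    def full_grad(w):
        # Full-data gradient, recomputed once per snapshot
        return X.T @ (X @ w - y) / n

    w = np.zeros(5)
    lr = 0.05  # constant learning rate, made viable by variance reduction
    for epoch in range(30):
        w_snap = w.copy()       # snapshot iterate
        mu = full_grad(w_snap)  # full gradient at the snapshot
        for _ in range(n):
            i = rng.integers(n)
            # Semi-stochastic gradient: unbiased, and its variance shrinks
            # as w approaches the optimum, which is what permits a constant
            # learning rate and linear convergence.
            v = sample_grad(w, i) - sample_grad(w_snap, i) + mu
            w -= lr * v

In the paper's setting, updates of this kind would be computed by multiple workers and applied asynchronously; the sketch above only shows the sequential core of the variance-reduced step.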


Similar resources

Asynchronous Stochastic Gradient Descent with Variance Reduction for Non-Convex Optimization

We provide the first theoretical analysis of the convergence rate of the asynchronous stochastic variance reduced gradient (SVRG) algorithm on nonconvex optimization. Recent studies have shown that asynchronous stochastic gradient descent (SGD) based algorithms with variance reduction converge at a linear rate on convex problems. However, there is no work to analyze asy...


Centralized and Decentralized Asynchronous Optimization of Stochastic Discrete Event Systems

We propose and analyze centralized and decentralized asynchronous control structures for the parametric optimization of stochastic Discrete Event Systems (DES) consisting of K distributed components. We use a stochastic approximation type of optimization scheme driven by gradient estimates of a global performance measure with respect to local control parameters. The estimates are obtained in di...



Decoupled Asynchronous Proximal Stochastic Gradient Descent with Variance Reduction

In the era of big data, optimizing large-scale machine learning problems has become a challenging task that draws significant attention. Asynchronous optimization algorithms have emerged as a promising solution. Recently, decoupled asynchronous proximal stochastic gradient descent (DAP-SGD) was proposed to minimize a composite function. It is claimed to be able to offload the computation bottleneck from...





Publication date: 2016